Search Results
Search for: All records
Total Resources: 4
Filter by Author / Creator
- Zhang, Mingxuan (4)
- Liang, Faming (2)
- Sun, Yan (2)
- Andrews, Justin L. (1)
- Brown, Alexandra C. (1)
- Chen, Tianyang (1)
- Cheng, Chong (1)
- Dincă, Mircea (1)
- Grayson, Scott M (1)
- Kampouri, Stavroula (1)
- Miles, Ashley (1)
- Nsengiyumva, Emmanuel M (1)
- Oppenheim, Julius J. (1)
- Payne, Michael T. (1)
- Su, Yafei (1)
- Sun, Junliang (1)
- Wu, Yun (1)
- Zamani, Masoud (1)
- Zhang, Ziwen (1)
- Zhang, Mingxuan; Sun, Yan; Liang, Faming (Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023)
  Free, publicly-accessible full text available July 15, 2026.
- Zhang, Mingxuan; Sun, Yan; Liang, Faming (Journal of Data Science)
  Large pretrained transformer models have revolutionized modern AI applications with their state-of-the-art performance in natural language processing (NLP). However, their substantial parameter count poses challenges for real-world deployment. To address this, researchers often reduce model size by pruning parameters based on their magnitude or sensitivity. Previous research has demonstrated the limitations of magnitude pruning, especially in the context of transfer learning for modern NLP tasks. In this paper, we introduce a new magnitude-based pruning algorithm called mixture Gaussian prior pruning (MGPP), which employs a mixture Gaussian prior for regularization. MGPP prunes non-expressive weights under the guidance of the mixture Gaussian prior, aiming to retain the model's expressive capability. Extensive evaluations across various NLP tasks, including natural language understanding, question answering, and natural language generation, demonstrate the superiority of MGPP over existing pruning methods, particularly in high sparsity settings. Additionally, we provide a theoretical justification for the consistency of the sparse transformer, shedding light on the effectiveness of the proposed pruning method.
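  As a concrete, purely illustrative picture of the idea in this abstract: under a two-component mixture Gaussian prior, a weight is worth keeping when the wide "slab" component explains it better than the narrow "spike". The sketch below computes the magnitude threshold where the two prior densities cross and applies it as a one-shot mask. This is a minimal sketch under assumed hyperparameters (sigma0, sigma1, lam); the helper names are hypothetical, and the paper's actual MGPP procedure regularizes training with the prior rather than thresholding once.

  ```python
  import math
  import torch

  def mixture_prior_threshold(sigma0: float, sigma1: float, lam: float) -> float:
      """Magnitude |w| at which the two prior densities cross, for
          p(w) = (1 - lam) * N(0, sigma0^2) + lam * N(0, sigma1^2),
      with sigma0 << sigma1. Below the crossing point a weight is more
      plausibly drawn from the narrow 'spike' (prunable) component."""
      a = 1.0 / sigma0**2 - 1.0 / sigma1**2  # > 0 since sigma0 < sigma1
      b = math.log(((1.0 - lam) * sigma1) / (lam * sigma0))
      return math.sqrt(2.0 * b / a)

  def prune_by_mixture_prior(model: torch.nn.Module,
                             sigma0: float = 1e-2,
                             sigma1: float = 1.0,
                             lam: float = 0.1) -> float:
      """Zero out, in one shot, every parameter entry whose magnitude falls
      below the mixture-prior crossing point; returns the threshold used.
      (A real pruner would typically restrict this to weight matrices.)"""
      threshold = mixture_prior_threshold(sigma0, sigma1, lam)
      with torch.no_grad():
          for param in model.parameters():
              param.mul_((param.abs() >= threshold).to(param.dtype))
      return threshold
  ```

  With the defaults shown, the crossing point lands at |w| ≈ 0.037, so only weights that the wide slab component can plausibly explain survive the mask.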
- Kampouri, Stavroula; Zhang, Mingxuan; Chen, Tianyang; Oppenheim, Julius J.; Brown, Alexandra C.; Payne, Michael T.; Andrews, Justin L.; Sun, Junliang; Dincă, Mircea (Angewandte Chemie International Edition)
  We report a metal‐organic framework (MOF) with a rare two‐dimensional (2D) secondary building unit (SBU). The SBU comprises mixed‐valent Fe2+ and Fe3+ metal ions bridged by oxygen atoms pertaining to the polytopic ligand 3,3′,4,4′,5,5′‐hexahydroxybiphenyl, which also define the iron‐oxide 2D layers. Overall, the anionic framework exhibits a rare topology and evidences strong electronic communication between the mixed‐valence iron sites. These results highlight the importance of dimensionality control of MOF SBUs for discovering new topologies in reticular chemistry, and especially for improving electronic communication within the MOF skeleton.